Generalized information potential criterion for adaptive system training

Authors

Deniz Erdogmus, Jose C. Principe

Abstract

We have previously proposed the quadratic Renyi's error entropy as an alternative cost function for supervised adaptive system training. An entropy criterion instructs the minimization of the average information content of the error signal rather than merely trying to minimize its energy. In this paper, we propose a generalization of the error entropy criterion that enables the use of any order...
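The quadratic criterion and its order-α generalization reduce, under Parzen windowing, to a sample estimate of the information potential. The Python sketch below is illustrative only: the Gaussian kernel, the kernel width, and the exact order-α normalization are assumptions rather than the paper's own formulation.

    # Sketch: Parzen-window estimate of Renyi's order-alpha error entropy
    # from a batch of error samples (alpha = 2 gives the quadratic case).
    import numpy as np

    def gaussian_kernel(u, sigma):
        return np.exp(-0.5 * (u / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

    def information_potential(errors, sigma=0.5, alpha=2.0):
        """V_alpha(e); for alpha = 2 this is (1/N^2) sum_ij k(e_i - e_j)."""
        e = np.asarray(errors, dtype=float)
        pairwise = gaussian_kernel(e[:, None] - e[None, :], sigma)
        inner = pairwise.sum(axis=1) / len(e)      # Parzen density estimate at each e_i
        return np.mean(inner ** (alpha - 1.0))

    def renyi_error_entropy(errors, sigma=0.5, alpha=2.0):
        """H_alpha(e) = log(V_alpha(e)) / (1 - alpha); alpha = 2 gives -log V_2."""
        return np.log(information_potential(errors, sigma, alpha)) / (1.0 - alpha)

For α > 1, minimizing H_α is equivalent to maximizing the information potential V_α, which is the quantity the adaptive system's weights are trained against.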

Similar articles

Convergence Analysis of the Information Potential Criterion in Adaline Training

In our recent studies we have proposed the use of minimum error entropy criterion as an alternative to minimum square error (MSE) in supervised adaptive system training. We have formulated a nonparametric estimator for Renyi’s entropy with the help of Parzen windowing. This formulation revealed interesting insights about the process of information theoretical learning. We have applied this new ...
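For a linear combiner (Adaline) y = wᵀx, the information potential criterion can be optimized with plain batch gradient ascent. The step below is a minimal sketch of the quadratic (α = 2) case with a Gaussian kernel; the step size and kernel width are illustrative assumptions and say nothing about the convergence conditions analyzed in the paper.

    # Sketch: one batch gradient-ascent step on the quadratic information
    # potential V_2 for an Adaline y = w^T x (ascending V_2 descends entropy).
    import numpy as np

    def ip_gradient_step(w, X, d, sigma=0.5, mu=0.1):
        """X: (N, D) inputs, d: (N,) desired responses, w: (D,) weights."""
        e = d - X @ w                                   # error samples
        de = e[:, None] - e[None, :]                    # e_i - e_j
        k = np.exp(-0.5 * (de / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
        dx = X[:, None, :] - X[None, :, :]              # x_i - x_j
        # dV_2/dw = (1 / (N^2 sigma^2)) * sum_ij k(e_i - e_j) (e_i - e_j) (x_i - x_j)
        grad = ((k * de)[:, :, None] * dx).sum(axis=(0, 1))
        grad /= len(e) ** 2 * sigma ** 2
        return w + mu * grad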


Discriminative Adaptive Training Using the MPE Criterion

This paper addresses the use of discriminative training criteria for Speaker Adaptive Training (SAT), where both the transform generation and model parameter estimation are estimated using the Minimum Phone Error (MPE) criterion. In a similar fashion to the use of I-smoothing for standard MPE training, a smoothing technique is introduced to avoid over-training when optimizing MPE-based feature-s...
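For context, the MPE objective that both the feature-space transforms and the model parameters are estimated against is usually written as an expected raw phone accuracy over the training utterances; the notation below is a generic statement of that criterion (the paper's smoothing technique is not shown).

    F_{\mathrm{MPE}}(\lambda) \;=\; \sum_{r=1}^{R}
      \frac{\sum_{W} p_{\lambda}(O_r \mid W)^{\kappa}\, P(W)\, A(W, W_r)}
           {\sum_{W'} p_{\lambda}(O_r \mid W')^{\kappa}\, P(W')}

Here O_r is the r-th utterance, A(W, W_r) is the raw phone accuracy of hypothesis W against the reference W_r, κ is an acoustic scaling factor, and λ collects the model (and, in SAT, transform) parameters; training maximizes F_MPE.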


Wilcoxon-type generalized Bayesian information criterion

We extend the basic idea of Schwarz (1978) and develop a generalized Bayesian information criterion for regression model selection. The new criterion relaxes the usually strong distributional assumption associated with Schwarz’s BIC by adopting a Wilcoxon-type dispersion function and appropriately adjusting the penalty term. We establish that the Wilcoxon-type generalized BIC preserves the cons...
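As background, Schwarz's criterion and a Wilcoxon-type rank dispersion of the kind the abstract alludes to can be written as below; the Jaeckel-style dispersion with Wilcoxon scores is shown as an assumption about what 'Wilcoxon-type' means, and the paper's adjusted penalty term is not reproduced.

    \mathrm{BIC} = -2\log\hat{L} + k\log n,
    \qquad
    D_n(\beta) = \sum_{i=1}^{n} \sqrt{12}\left(\frac{R_i(\beta)}{n+1} - \frac{1}{2}\right)\bigl(y_i - x_i^{\top}\beta\bigr)

Here \hat{L} is the maximized likelihood, k the number of parameters, n the sample size, and R_i(β) the rank of the i-th residual; the proposed criterion combines the minimized dispersion with an appropriately adjusted penalty in place of the likelihood term.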


Regularization Parameter Selections via Generalized Information Criterion.

We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrinkage estimators. This approach relies heavily on the choice of regularization parameter, which controls the model complexity. In this paper, we propose employing the generalized information criterion (GIC), encompassing the commonly used Akaike information criterion (AIC) and Bayesian information...
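The GIC family can be summarized with a single penalty weight a_n: a_n = 2 recovers AIC and a_n = log n recovers BIC. The Python sketch below is illustrative only; the fit callable, its returned log-likelihood and degrees of freedom, and the grid of regularization parameters are assumptions, not the paper's procedure for nonconcave penalized likelihood.

    # Sketch: selecting a regularization parameter by a generalized
    # information criterion GIC = -2 * loglik + a_n * df.
    import numpy as np

    def gic(loglik, df, a_n):
        return -2.0 * loglik + a_n * df

    def select_lambda(lambdas, fit, n, a_n=None):
        """fit(lam) is assumed to return (loglik, df) for the model fitted
        at regularization level lam; a_n = log(n) gives a BIC-type choice."""
        if a_n is None:
            a_n = np.log(n)
        scores = {lam: gic(*fit(lam), a_n) for lam in lambdas}
        return min(scores, key=scores.get)          # smallest GIC wins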



Journal

Journal title: IEEE Transactions on Neural Networks

Year: 2002

ISSN: 1045-9227

DOI: 10.1109/tnn.2002.1031936